Efficient Antihydrogen Detection in Antimatter Physics by Deep Learning
Antihydrogen is at the forefront of antimatter research at the CERN
Antiproton Decelerator. Experiments aiming to test the fundamental CPT symmetry
and antigravity effects require the efficient detection of antihydrogen
annihilation events, which is performed using highly granular tracking
detectors installed around an antimatter trap. Improving the efficiency of the
antihydrogen annihilation detection plays a central role in the final
sensitivity of the experiments. We propose deep learning as a novel technique
to analyze antihydrogen annihilation data, and compare its performance with a
traditional track and vertex reconstruction method. We report that the deep
learning approach yields significant improvement, tripling event coverage while
simultaneously improving performance by over 5% in terms of Area Under the
Curve (AUC).
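The AUC figure quoted above has a direct probabilistic reading: it is the probability that a randomly chosen signal event outscores a randomly chosen background event (the Mann-Whitney U statistic). A minimal NumPy sketch of that rank-based computation, on toy scores rather than the experiment's data:

```python
import numpy as np

def auc(scores_signal, scores_background):
    """Area Under the ROC Curve via the rank-sum (Mann-Whitney U) statistic:
    the probability that a random signal event scores higher than a random
    background event. Ties count as half a win."""
    s = np.asarray(scores_signal, dtype=float)
    b = np.asarray(scores_background, dtype=float)
    wins = (s[:, None] > b[None, :]).sum() + 0.5 * (s[:, None] == b[None, :]).sum()
    return wins / (len(s) * len(b))

# A perfect separator ranks every signal event above every background event.
print(auc([0.9, 0.8, 0.7], [0.3, 0.2, 0.1]))  # -> 1.0
# Here 3 of the 4 signal/background pairs are ordered correctly.
print(auc([0.9, 0.2], [0.8, 0.1]))            # -> 0.75
```

The pairwise comparison is O(n*m); production code would use a sorting-based O(n log n) variant, but the quantity computed is the same.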
Parameterized Machine Learning for High-Energy Physics
We investigate a new structure for machine learning classifiers applied to
problems in high-energy physics by expanding the inputs to include not only
measured features but also physics parameters. The physics parameters represent
a smoothly varying learning task, and the resulting parameterized classifier
can smoothly interpolate between them and replace sets of classifiers trained
at individual values. This simplifies the training process and gives improved
performance at intermediate values, even for complex problems requiring deep
learning. Applications include tools parameterized in terms of theoretical
model parameters, such as the mass of a particle, which allow for a single
network to provide improved discrimination across a range of masses. This
concept is simple to implement and allows for optimized interpolatable results.Comment: For submission to PR
Learning and exploiting mixed variable dependencies with a model-based EA
Mixed-integer optimization considers problems with both discrete and continuous variables. The ability to learn and process problem structure can be of paramount importance for optimization, particularly when faced with black-box optimization (BBO) problems, where no structural knowledge is available a priori. For such cases, model-based Evolutionary Algorithms (EAs) have been very successful in the fields of discrete and continuous optimization. In this paper, we present a model-based EA which integrates techniques from the discrete and continuous domains in order to tackle mixed-integer problems. We furthermore introduce novel mechanisms to learn and exploit mixed-variable dependencies. Previous approaches learned dependencies explicitly only within either the discrete or the continuous domain. The potential usefulness of addressing mixed dependencies directly is assessed by empirically analyzing algorithm performance on a selection of mixed-integer problems with different types of variable interactions. We find substantially improved, scalable performance on problems that exhibit mixed dependencies.
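The value of learning mixed-variable dependencies can be illustrated with a toy estimation-of-distribution EA: the elite solutions are modelled by a discrete marginal plus a Gaussian over the continuous variable conditioned on the discrete one, so sampled offspring respect the cross-domain interaction. This is a minimal NumPy sketch on a hypothetical objective, not the paper's algorithm or its benchmark problems:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem with a mixed dependency: the optimal continuous value x
# depends on the discrete choice d. (Hypothetical objective.)
targets = {0: -3.0, 1: 0.0, 2: 4.0}

def fitness(d, x):
    return -(x - targets[int(d)]) ** 2

def run_eda(generations=40, pop=200, elite=50):
    d = rng.integers(0, 3, pop)          # discrete genes in {0, 1, 2}
    x = rng.uniform(-5, 5, pop)          # continuous genes in [-5, 5]
    for _ in range(generations):
        f = np.array([fitness(di, xi) for di, xi in zip(d, x)])
        top = np.argsort(f)[-elite:]     # truncation selection
        d_e, x_e = d[top], x[top]
        # Model the JOINT structure: a marginal over d plus a Gaussian over
        # x CONDITIONED on d. The conditioning is what captures the mixed
        # dependency; modelling d and x independently would smear x across
        # all three optima and stall.
        probs = np.bincount(d_e, minlength=3) / elite
        mu = np.array([x_e[d_e == k].mean() if (d_e == k).any() else 0.0
                       for k in range(3)])
        sd = np.array([x_e[d_e == k].std() + 1e-3 if (d_e == k).any() else 1.0
                       for k in range(3)])
        # Sample the next generation from the learned mixed model.
        d = rng.choice(3, pop, p=probs)
        x = rng.normal(mu[d], sd[d])
    best = np.argmax([fitness(di, xi) for di, xi in zip(d, x)])
    return d[best], x[best]

d_best, x_best = run_eda()
# The population converges onto one discrete mode with x near its target.
print(int(d_best), round(float(x_best), 2))
```

An independent-variable model is a single edit away (drop the conditioning on `d_e == k`), which makes the contrast the abstract draws easy to reproduce.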